# Weakly Supervised Pre-training

## Tapas Large Finetuned Tabfact
License: Apache-2.0
TAPAS is a BERT-based Transformer model designed specifically for processing tabular data. It is pre-trained on English Wikipedia tables through self-supervised learning and fine-tuned on the TabFact dataset to verify whether a sentence is supported or refuted by the table content.
Tags: Large Language Model, Transformers, English
Publisher: google · Downloads: 3,806 · Likes: 4
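Below is a minimal usage sketch for this checkpoint with the Hugging Face `transformers` library. It assumes the repo id is `google/tapas-large-finetuned-tabfact`; the table and sentence are made-up examples, not taken from TabFact.

```python
import pandas as pd
from transformers import TapasForSequenceClassification, TapasTokenizer

# Assumed Hugging Face repo id for this card.
model_name = "google/tapas-large-finetuned-tabfact"
tokenizer = TapasTokenizer.from_pretrained(model_name)
model = TapasForSequenceClassification.from_pretrained(model_name)

# Illustrative table; TAPAS expects every cell value as a string.
table = pd.DataFrame(
    {"Actor": ["Brad Pitt", "George Clooney"], "Number of movies": ["87", "69"]}
)
sentence = "George Clooney has appeared in 69 movies."

inputs = tokenizer(table=table, queries=[sentence], padding="max_length", return_tensors="pt")
outputs = model(**inputs)

# The classification head scores refuted vs. supported; label names come from the model config.
predicted_class = outputs.logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_class])
```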
## Tapas Base Finetuned Wikisql Supervised
License: Apache-2.0
TAPAS is a BERT-based Transformer model designed specifically for table question answering. It is pre-trained in a self-supervised manner on English Wikipedia table data and fine-tuned on the WikiSQL dataset, supporting weakly supervised table parsing.
Tags: Question Answering System, Transformers, English
Publisher: google · Downloads: 737 · Likes: 9
## Tapas Base Finetuned Wtq
License: Apache-2.0
TAPAS is a Transformer-based table question answering model, pre-trained on Wikipedia table data through self-supervised learning and fine-tuned on datasets such as WTQ (WikiTableQuestions).
Tags: Question Answering System, Transformers, English
Publisher: google · Downloads: 23.03k · Likes: 217
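The question-answering checkpoints are queried in much the same way. The sketch below assumes the repo id `google/tapas-base-finetuned-wtq` and an illustrative table; the same pattern applies to the WikiSQL-supervised checkpoint by swapping the repo id.

```python
import pandas as pd
from transformers import TapasForQuestionAnswering, TapasTokenizer

# Assumed Hugging Face repo id for this card.
model_name = "google/tapas-base-finetuned-wtq"
tokenizer = TapasTokenizer.from_pretrained(model_name)
model = TapasForQuestionAnswering.from_pretrained(model_name)

# Illustrative table; TAPAS expects every cell value as a string.
table = pd.DataFrame(
    {"City": ["Paris", "London", "Berlin"], "Population (millions)": ["2.1", "8.8", "3.6"]}
)
queries = ["Which city has the largest population?"]

inputs = tokenizer(table=table, queries=queries, padding="max_length", return_tensors="pt")
outputs = model(**inputs)

# Map token-level logits back to table cells and an aggregation operator
# (0=NONE, 1=SUM, 2=AVERAGE, 3=COUNT for the WTQ head).
coords, agg_indices = tokenizer.convert_logits_to_predictions(
    inputs, outputs.logits.detach(), outputs.logits_aggregation.detach()
)
answer_cells = [table.iat[coord] for coord in coords[0]]
print(answer_cells, agg_indices[0])
```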